
    Essential Constraints of Edge-Constrained Proximity Graphs

    Given a plane forest $F = (V, E)$ of $|V| = n$ points, we find the minimum set $S \subseteq E$ of edges such that the edge-constrained minimum spanning tree over the set $V$ of vertices and the set $S$ of constraints contains $F$. We present an $O(n \log n)$-time algorithm that solves this problem. We generalize this to other proximity graphs in the constraint setting, such as the relative neighbourhood graph, Gabriel graph, $\beta$-skeleton and Delaunay triangulation. We present an algorithm that identifies the minimum set $S \subseteq E$ of edges of a given plane graph $I = (V, E)$ such that $I \subseteq CG_\beta(V, S)$ for $1 \leq \beta \leq 2$, where $CG_\beta(V, S)$ is the constrained $\beta$-skeleton over the set $V$ of vertices and the set $S$ of constraints. The running time of our algorithm is $O(n)$, provided that the constrained Delaunay triangulation of $I$ is given.
    Comment: 24 pages, 22 figures. A preliminary version of this paper appeared in the Proceedings of the 27th International Workshop on Combinatorial Algorithms (IWOCA 2016), Helsinki, Finland, published by Springer in the Lecture Notes in Computer Science (LNCS) series.
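    For context on the definitions above, here is a minimal brute-force sketch (not from the paper) of the unconstrained lune-based $\beta$-skeleton for $1 \leq \beta \leq 2$; the constrained variant and the paper's $O(n)$ construction from the constrained Delaunay triangulation are not reproduced.

```python
import math
from itertools import combinations

def lerp(p, q, t):
    """Point at parameter t on the segment p -> q."""
    return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

def beta_skeleton(points, beta):
    """Brute-force lune-based beta-skeleton for 1 <= beta <= 2.

    Edge (u, v) survives iff no third point lies strictly inside the
    lune: the intersection of the two disks of radius beta*|uv|/2
    centred at lerp(u, v, beta/2) and lerp(v, u, beta/2). beta = 1
    gives the Gabriel graph, beta = 2 the relative neighbourhood graph.
    """
    edges = []
    for (i, u), (j, v) in combinations(enumerate(points), 2):
        r = beta * math.dist(u, v) / 2.0
        c1, c2 = lerp(u, v, beta / 2.0), lerp(v, u, beta / 2.0)
        if not any(math.dist(w, c1) < r and math.dist(w, c2) < r
                   for k, w in enumerate(points) if k not in (i, j)):
            edges.append((i, j))
    return edges
```

    This runs in O(n^3) time and serves only to make the lune-based definition concrete.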

    Effect of Variable Selection Strategy on the Performance of Prognostic Models When Using Multiple Imputation

    BACKGROUND: Variable selection is an important issue when developing prognostic models. Missing data occur frequently in clinical research. Multiple imputation is increasingly used to address the presence of missing data in clinical research. The effect of different variable selection strategies with multiply imputed data on the external performance of derived prognostic models has not been well examined. METHODS AND RESULTS: We used backward variable selection with 9 different ways to handle multiply imputed data in a derivation sample to develop logistic regression models for predicting death within 1 year of hospitalization with an acute myocardial infarction. We assessed the prognostic accuracy of each derived model in a temporally distinct validation sample. The derivation and validation samples consisted of 11,524 patients hospitalized between 1999 and 2001 and 7,889 patients hospitalized between 2004 and 2005, respectively. We considered 41 candidate predictor variables. Missing data occurred frequently, with only 13% of patients in the derivation sample and 31% of patients in the validation sample having complete data. Regardless of the significance level for variable selection, the prognostic model developed using only the complete cases in the derivation sample had substantially worse performance in the validation sample than did the models for which variables were selected using the multiply imputed versions of the derivation sample. The other 8 approaches to handling multiply imputed data resulted in prognostic models with performance similar to one another. CONCLUSIONS: Ignoring missing data and using only subjects with complete data can result in the derivation of prognostic models with poor performance. Multiple imputation should be used to account for missing data when developing prognostic models
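    For illustration, here is a minimal sketch of one plausible strategy of the kind the paper compares: at each backward-elimination step, refit the model on every imputed dataset and drop the variable that is non-significant in the most imputations, stopping once every remaining variable is significant in a majority. The data layout and names are assumptions, and this is only one of the nine approaches the study evaluates.

```python
import statsmodels.api as sm

def backward_select(imputed_dfs, outcome, candidates, alpha=0.05):
    """imputed_dfs: list of pandas DataFrames, one per imputation."""
    selected = list(candidates)
    while selected:
        # Count, per variable, the imputations in which it is
        # non-significant in the current model.
        nonsig = {var: 0 for var in selected}
        for df in imputed_dfs:
            X = sm.add_constant(df[selected])
            pvals = sm.Logit(df[outcome], X).fit(disp=0).pvalues
            for var in selected:
                if pvals[var] > alpha:
                    nonsig[var] += 1
        worst = max(selected, key=lambda v: nonsig[v])
        # Stop when every variable is significant in a majority of
        # the imputed datasets; otherwise drop the worst offender.
        if nonsig[worst] <= len(imputed_dfs) // 2:
            break
        selected.remove(worst)
    return selected
```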

    Validation of a prognostic scoring system for locally recurrent nasopharyngeal carcinoma treated by stereotactic radiosurgery

    BACKGROUND: Selection of patients with local failure of nasopharyngeal carcinoma (NPC) for the appropriate type of salvage treatment can be difficult due to the lack of data on the comparative efficacy of different salvage treatments. The purpose of the present study was to validate a previously published prognostic scoring system for local failures of NPC treated by radiosurgery, based on results reported in the literature. METHODS: A literature search yielded 3 published reports on the use of radiosurgery as salvage treatment of NPC that contained sufficient clinical information for validation of the scoring system. Prognostic scores of 18 patients from these reports were calculated, and actuarial survival rates were estimated and compared to those of the original cohort used to design the prognostic scoring system. The area under the receiver operating characteristic curve was also determined and compared between the current and original patient groups. RESULTS: The calculated prognostic scores ranged from 0.32 to 1.21, with 15 patients assigned to the poor prognostic group and 3 to the intermediate prognostic group. The actuarial 3-year survival rates in the intermediate and poor prognostic groups were 67% and 0%, respectively. These results were comparable to the observed 3-year survival rates of 74% and 23% in the intermediate and poor prognostic groups in the original reports. The area under the receiver operating characteristic curve for the current patient group was 0.846, similar to the 0.841 obtained in the original group. CONCLUSION: The previously published prognostic scoring system demonstrated good prediction of treatment outcome after radiosurgery in a small group of NPC patients with poor prognosis. A prospective study to validate the scoring system is currently being carried out in our institution.
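    The comparison above rests on the area under the receiver operating characteristic curve. A minimal sketch of the standard Mann-Whitney formulation of that statistic follows, with hypothetical scores and outcomes (a higher prognostic score is taken to indicate worse prognosis); the study's own computation may differ in details such as tie handling.

```python
def roc_auc(scores, died):
    """AUC = probability that a random patient who died scored higher
    than a random survivor, counting ties as half."""
    pos = [s for s, d in zip(scores, died) if d]
    neg = [s for s, d in zip(scores, died) if not d]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical data: two deaths with high scores, two survivors with low.
print(roc_auc([1.21, 0.90, 0.55, 0.32], [True, True, False, False]))  # 1.0
```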

    A Multi-Objective Optimization for Supply Chain Network Using the Bees Algorithm

    A supply chain is a complex network involving the flow of products, services, and information between suppliers and customers. A typical supply chain is composed of several levels; hence, there is a need to optimize it by finding the configuration of the network that achieves a good compromise between multiple objectives such as cost minimization and lead-time minimization. Several multi-objective optimization methods have been applied to find sets of optimal solutions along the Pareto front. In this study, a swarm-based optimization method, namely the bees algorithm, is applied to a multi-objective supply chain model to find the configuration of a given supply chain that minimizes the total cost and the total lead-time. The supply chain problem used in this study is taken from the literature, and several experiments were conducted to demonstrate the performance of the proposed model; in addition, the results were compared to those achieved by the ant colony optimization method. The results show that the proposed bees algorithm achieves better Pareto solutions for the supply chain problem
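    A heavily simplified sketch of a bees-algorithm loop combined with a Pareto archive for two minimization objectives (standing in for total cost and total lead-time) is shown below. All parameters, the crude sum-of-objectives ranking, and the toy objective are assumptions; the paper's supply-chain encoding and its exact selection rule are not reproduced.

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def bees(objectives, dim, n_scouts=20, n_best=5, n_recruits=10,
         radius=0.1, iters=200):
    scouts = [[random.random() for _ in range(dim)] for _ in range(n_scouts)]
    archive = []  # non-dominated (solution, objectives) pairs
    for _ in range(iters):
        ranked = sorted(scouts, key=lambda s: sum(objectives(s)))
        new_scouts = []
        # Neighbourhood search around the best sites.
        for site in ranked[:n_best]:
            nbrs = [[min(1.0, max(0.0, x + random.uniform(-radius, radius)))
                     for x in site] for _ in range(n_recruits)]
            new_scouts.append(min(nbrs + [site],
                                  key=lambda s: sum(objectives(s))))
        # Remaining bees scout at random.
        new_scouts += [[random.random() for _ in range(dim)]
                       for _ in range(n_scouts - n_best)]
        scouts = new_scouts
        # Update the Pareto archive with any non-dominated solution.
        for s in scouts:
            f = objectives(s)
            if not any(dominates(g, f) for _, g in archive):
                archive = [(t, g) for t, g in archive if not dominates(f, g)]
                archive.append((s, f))
    return archive

# Toy trade-off standing in for (total cost, total lead-time).
front = bees(lambda s: (s[0], (1.0 - s[0]) ** 2), dim=1)
```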

    The Dropping of In-Medium Hadron Mass in Holographic QCD

    We study the baryon density dependence of the vector meson spectrum using the D4/D6 system together with the compact D4 baryon vertex. We find that the vector meson mass decreases almost linearly in density at low density for small quark mass, but saturates to a finite non-zero value at large density. We also compute the density dependence of the $\eta'$ mass and the $\eta'$ velocity. We find that in medium, our model is consistent with the GMOR relation up to a few times normal nuclear density. We compare our hQCD predictions with predictions based on hidden local gauge theory constructed to model QCD.
    Comment: 20 pages, 7 figures
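    For reference, the vacuum Gell-Mann-Oakes-Renner (GMOR) relation that the abstract tests in medium relates the pion mass and decay constant to the quark masses and the chiral condensate; the in-medium statement replaces these quantities by their density-dependent values. Sign conventions for the condensate vary between references; one standard form is:

```latex
% Vacuum GMOR relation, to leading order in the quark masses.
f_\pi^2\, m_\pi^2 = -\,(m_u + m_d)\,\langle \bar{q} q \rangle + \mathcal{O}(m_q^2)
```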

    Phylo-mLogo: an interactive and hierarchical multiple-logo visualization tool for alignment of many sequences

    BACKGROUND: When aligning hundreds or thousands of sequences, such as epidemic virus sequences or homologous/orthologous sequences of large gene families, to reconstruct epidemiological histories or phylogenies, analyzing and visualizing the alignment results has become a new challenge for computational biologists. Although several tools are available for visualization of very long sequence alignments, few are applicable to alignments of many sequences. RESULTS: A multiple-logo alignment visualization tool, called Phylo-mLogo, is presented in this paper. Phylo-mLogo calculates the variabilities and homogeneities of alignment sequences by base frequencies or entropies. Unlike traditional sequence-logo representations, Phylo-mLogo not only displays the global logo patterns of the whole alignment of multiple sequences, but also demonstrates the local homologous logos of each clade hierarchically. In addition, Phylo-mLogo allows the user to focus the analysis on important, structurally or functionally constrained sites in the alignment, selected by the user or by built-in automatic calculation. CONCLUSION: With Phylo-mLogo, the user can symbolically and hierarchically visualize hundreds of aligned sequences simultaneously and easily check changes at amino acid sites when analyzing many homologous/orthologous or influenza virus sequences. More information about Phylo-mLogo can be found at URL
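    A minimal sketch of the standard per-column Shannon entropy that such a tool could use to score column homogeneity follows (low entropy means a conserved column); the gap handling and any sequence weighting in Phylo-mLogo itself may differ.

```python
import math
from collections import Counter

def column_entropies(alignment):
    """Per-column Shannon entropy of a list of equal-length sequences."""
    entropies = []
    for col in zip(*alignment):
        counts = Counter(c for c in col if c != '-')  # ignore gap characters
        total = sum(counts.values())
        if total == 0:  # all-gap column
            entropies.append(0.0)
            continue
        h = -sum((n / total) * math.log2(n / total)
                 for n in counts.values())
        entropies.append(h if h > 0 else 0.0)
    return entropies

# Columns 1-2 are fully conserved; columns 3-4 each split 2:1.
print(column_entropies(["ACGT", "ACGA", "ACTA"]))  # [0.0, 0.0, 0.918..., 0.918...]
```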

    Comparison of single versus fractionated dose of stereotactic radiotherapy for salvaging local failures of nasopharyngeal carcinoma: a matched-cohort analysis

    BACKGROUND: Local failure is an important cause of morbidity and mortality in nasopharyngeal carcinoma (NPC). Although surgery or brachytherapy may be feasible in selected cases, most patients with local failure require external beam re-irradiation. Stereotactic radiation using single or multiple fractions has been employed in re-irradiation of NPC, but the optimal fractionation scheme and dose are not clear. METHODS: Records of 125 NPC patients who received salvage stereotactic radiation were reviewed. A matched-pair design was used to select patients with similar prognostic factors who received stereotactic re-irradiation using a single fraction (SRS) or multiple fractions (SRM). Eighty-six patients were selected, with equal numbers in the SRS and SRM groups. All patients were individually matched for failure type (persistent or recurrent), rT stage (rT1-2 or rT3-4), and tumor volume (5-10 cc or >10 cc). Median dose was 12.5 Gy in a single fraction for SRS, and 34 Gy in 2-6 fractions for SRM. RESULTS: The local control rate was better in the SRM group, although overall survival rates were similar. One- and 3-year local failure-free rates were 70% and 51% in the SRS group compared with 91% and 83% in the SRM group (p = 0.003). One- and 3-year overall survival rates were 98% and 66% in the SRS group compared with 78% and 61% in the SRM group (p = 0.31). The differences in local control were mainly observed in recurrent or rT2-4 disease. The incidence of severe late complications was 33% in the SRS group vs. 21% in the SRM group, including brain necrosis (16% vs. 12%) and hemorrhage (5% vs. 2%). CONCLUSION: Our study showed that SRM was superior to SRS in salvaging local failures of NPC, especially in the treatment of recurrent and rT2-4 disease. In patients with local failure of NPC suitable for stereotactic re-irradiation, fractionated treatment is preferred.

    Power-Law Distributions in a Two-sided Market and Net Neutrality

    "Net neutrality" often refers to the policy dictating that an Internet service provider (ISP) cannot charge content providers (CPs) for delivering their content to consumers. Many past quantitative models designed to determine whether net neutrality is a good idea have been rather equivocal in their conclusions. Here we propose a very simple two-sided market model, in which the types of the consumers and the CPs are {\em power-law distributed} --- a kind of distribution known to often arise precisely in connection with Internet-related phenomena. We derive mostly analytical, closed-form results for several regimes: (a) Net neutrality, (b) social optimum, (c) maximum revenue by the ISP, or (d) maximum ISP revenue under quality differentiation. One unexpected conclusion is that (a) and (b) will differ significantly, unless average CP productivity is very high

    Algorithms for Colourful Simplicial Depth and Medians in the Plane

    The colourful simplicial depth of a point x in the plane relative to a configuration of n points in k colour classes is exactly the number of closed simplices (triangles) with vertices from 3 different colour classes that contain x in their convex hull. We consider the problems of efficiently computing the colourful simplicial depth of a point x, and of finding a point, called a median, that maximizes colourful simplicial depth. For computing the colourful simplicial depth of x, our algorithm runs in time O(n log(n) + kn) in general, and O(kn) if the points are sorted around x. For finding a colourful median, our algorithm runs in time O(n^4). For comparison, the running times of the best known algorithms for the monochrome versions of these problems are O(n log(n)) in general, improving to O(n) if the points are sorted around x for monochrome depth, and O(n^4) for finding a monochrome median.
    Comment: 17 pages, 8 figures
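    A brute-force check of the definition above (not the paper's O(n log(n) + kn) algorithm) simply counts closed trichromatic triangles containing x; points are assumed to be (x, y, colour) triples in this sketch.

```python
from itertools import combinations

def sign(o, a, b):
    """Cross product (a - o) x (b - o): orientation of b about line o->a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_closed_triangle(x, a, b, c):
    s1, s2, s3 = sign(a, b, x), sign(b, c, x), sign(c, a, x)
    has_neg = s1 < 0 or s2 < 0 or s3 < 0
    has_pos = s1 > 0 or s2 > 0 or s3 > 0
    return not (has_neg and has_pos)  # boundary counts (closed triangle)

def colourful_depth(x, points):
    return sum(
        1
        for a, b, c in combinations(points, 3)
        if len({a[2], b[2], c[2]}) == 3
        and in_closed_triangle(x, a[:2], b[:2], c[:2])
    )

pts = [(1, 0, 'r'), (-1, 1, 'g'), (-1, -1, 'b'), (2, 2, 'r')]
print(colourful_depth((0, 0), pts))  # 2: both trichromatic triangles contain the origin
```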

    Cooper pairing near charged black holes

    We show that a quartic contact interaction between charged fermions can lead to Cooper pairing and a superconducting instability in the background of a charged asymptotically Anti-de Sitter black hole. For a massless fermion we obtain the zero mode analytically and compute the dependence of the critical temperature T_c on the charge of the fermion. The instability we find occurs at charges above a critical value, where the fermion dispersion relation near the Fermi surface is linear. The critical temperature goes to zero as the marginal Fermi liquid is approached, together with the density of states at the Fermi surface. Besides the charge, the critical temperature is controlled by a four-point function of a fermionic operator in the dual strongly coupled field theory.
    Comment: 1+33 pages, 4 figures